Measuring the quantum state of a single system with minimum state disturbance
Conventionally, unknown quantum states are characterized using quantum-state
tomography based on strong or weak measurements carried out on an ensemble of
identically prepared systems. By contrast, the use of protective measurements
offers the possibility of determining quantum states from a series of weak,
long measurements performed on a single system. Because the fidelity of a
protectively measured quantum state is determined by the amount of state
disturbance incurred during each protective measurement, it is crucial that the
initial quantum state of the system is disturbed as little as possible. Here we
show how to systematically minimize the state disturbance in the course of a
protective measurement, thus enabling the maximization of the fidelity of the
quantum-state measurement. Our approach is based on a careful tuning of the
time dependence of the measurement interaction and is shown to be dramatically
more effective in reducing the state disturbance than the previously considered
strategy of weakening the measurement strength and increasing the measurement
time. We describe a method for designing the measurement interaction such that
the state disturbance exhibits polynomial decay to arbitrary order in the
inverse measurement time. We also show how one can achieve even faster, subexponential decay, which we find represents the smallest possible state disturbance attainable in a protective measurement. In this way, our results show how to optimally measure the state of a single quantum system using protective measurements.
Comment: 7 pages, 4 figures, identical to published version.
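To illustrate the mechanism behind this result, here is a minimal numerical sketch, not the paper's actual model: a qubit protected by H0 = sz is weakly coupled through g(t)*sx, with the coupling switched on and off by an envelope f(t/T). The pointer degree of freedom is omitted; the sketch only shows how the smoothness of the envelope at the endpoints controls how fast the state disturbance decays with the total measurement time T, in the spirit of the adiabatic theorem. The Hamiltonians, envelopes, and parameter values (g_max, step counts) are illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy model: qubit protected by H0 = sz, measurement coupling g(t) * sx.
I2 = np.eye(2, dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def disturbance(T, envelope, g_max=0.2, steps_per_unit_time=200):
    """Evolve |0> under H(t) = sz + g_max*envelope(t/T)*sx and return
    the state disturbance 1 - |<0|psi(T)>|^2."""
    steps = max(1000, int(steps_per_unit_time * T))
    dt = T / steps
    psi = np.array([1, 0], dtype=complex)
    for k in range(steps):
        s = (k + 0.5) / steps                  # midpoint of the time step
        a, b = 1.0, g_max * envelope(s)
        r = np.hypot(a, b)
        # Exact matrix exponential of a traceless 2x2 Hamiltonian:
        U = np.cos(r * dt) * I2 - 1j * np.sin(r * dt) * (a * sz + b * sx) / r
        psi = U @ psi
    return 1.0 - abs(psi[0]) ** 2

env_c1 = lambda s: np.sin(np.pi * s) ** 2      # f and f' vanish at the endpoints
env_c3 = lambda s: np.sin(np.pi * s) ** 4      # f, f', f'', f''' vanish

for T in [10.0, 20.0, 40.0, 80.0]:
    print(f"T = {T:5.0f}   sin^2: {disturbance(T, env_c1):.2e}"
          f"   sin^4: {disturbance(T, env_c3):.2e}")
```

In this toy model the sin^4 envelope, whose first three derivatives vanish at the endpoints, suppresses the disturbance markedly faster in T than sin^2, illustrating the sense in which shaping the time dependence of the interaction can beat simply weakening the coupling and measuring for longer.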
No-go theorem for the composition of quantum systems
Building on the Pusey-Barrett-Rudolph theorem, we derive a no-go theorem for
a vast class of deterministic hidden-variables theories, including those
consistent on their targeted domain. The strength of this result throws doubt
on seemingly natural assumptions (like the "preparation independence" of the
Pusey-Barrett-Rudolph theorem) about how "real states" of subsystems compose
for joint systems in nonentangled states. This points to constraints in
modeling tensor-product states, similar to constraints demonstrated for more
complex states by the Bell and Bell-Kochen-Specker theorems.
Comment: 4 pages. v2: new title, significant revisions. v3: condensed, matches final published version.
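For context, the Pusey-Barrett-Rudolph argument that this theorem builds on rests on a concrete two-qubit construction: four nonentangled preparations (|0> or |+> on each system) and an entangled measurement basis in which each outcome has zero probability for exactly one of the preparations. The short check below verifies that standard construction numerically; it illustrates only the PBR ingredient, not the new no-go theorem derived in the paper.

```python
import numpy as np

# Single-qubit states used in the PBR construction.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)
kron = np.kron

# The four product preparations, in the order |00>, |0+>, |+0>, |++>.
preparations = [kron(a, b) for a in (ket0, plus) for b in (ket0, plus)]

# The entangled PBR measurement basis: outcome i never occurs for
# preparation i, so observing outcome i rules that preparation out.
xi = [
    (kron(ket0, ket1) + kron(ket1, ket0)) / np.sqrt(2),
    (kron(ket0, minus) + kron(ket1, plus)) / np.sqrt(2),
    (kron(plus, ket1) + kron(minus, ket0)) / np.sqrt(2),
    (kron(plus, minus) + kron(minus, plus)) / np.sqrt(2),
]

# Check that the four outcome vectors form an orthonormal basis.
G = np.array([[np.vdot(a, b) for b in xi] for a in xi])
assert np.allclose(G, np.eye(4))

# Check that each outcome has probability zero for "its" preparation.
for i, (out, prep) in enumerate(zip(xi, preparations)):
    p = abs(np.vdot(out, prep)) ** 2
    print(f"P(outcome {i} | preparation {i}) = {p:.1e}")
```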
What classicality? Decoherence and Bohr's classical concepts
Niels Bohr famously insisted on the indispensability of what he termed
"classical concepts." In the context of the decoherence program, on the other
hand, it has become fashionable to talk about the "dynamical emergence of
classicality" from the quantum formalism alone. Does this mean that decoherence
challenges Bohr's dictum -- for example, that classical concepts do not need to
be assumed but can be derived? In this paper, we'll try to shed some light on the murky waters where formalism and philosophy mingle. To begin, we'll clarify
the notion of classicality in the decoherence description. We'll then discuss
Bohr's and Heisenberg's takes on the quantum-classical problem and reflect on
the different meanings of the terms "classicality" and "classical concepts" in
the writings of Bohr and his followers. This analysis will allow us to put
forward some tentative suggestions for how we may better understand the
relation between decoherence-induced classicality and Bohr's classical
concepts.
Comment: 6 pages.
Scheme for the protective measurement of a single photon using a tunable quantum Zeno effect
This paper presents a proof-of-principle scheme for the protective measurement of a single photon. In this scheme, the photon is looped arbitrarily many times through an optical stage that implements a weak measurement of a polarization observable followed by a strong measurement protecting the state. The ability of this scheme to realize a large number of such interaction-protection steps means that the uncertainty in the measurement result can be drastically reduced while maintaining a sufficient probability for the photon to survive the measurement.
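A minimal toy sketch of the Zeno protection at work, not a simulation of the actual optical stage: each loop is modeled as a small unitary polarization kick of angle g/N, standing in for the weak-measurement back-action, followed by projection onto the protected state |H>. The total coupling g is held fixed while being split across N loops, so the survival probability approaches one as N grows, which is what lets the scheme accumulate many interaction-protection steps. The symbols N and g here are illustrative.

```python
import numpy as np

def survival(N, g=1.0):
    """Survival probability of the photon after N weak-kick + projection loops.
    Each loop: rotation by angle g/N in the H/V plane (toy back-action),
    then projection onto the protected state |H>."""
    theta = g / N                     # per-loop disturbance angle
    amp_per_loop = np.cos(theta)      # <H| R(theta) |H> for an H/V-plane rotation
    return amp_per_loop ** (2 * N)    # probability after N projective protections

# As N grows at fixed total coupling g, survival tends to exp(-g**2 / N) -> 1.
for N in [1, 10, 100, 1000, 10000]:
    print(f"N = {N:5d}   survival = {survival(N):.4f}")
```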